Deep Learning Containers


Achieve 35% faster training with Hugging Face Deep Learning Containers on Amazon SageMaker

#artificialintelligence

Natural language processing (NLP) has been a hot topic in the AI field for some time. As NLP models grow ever larger, data scientists and developers struggle to provision infrastructure that keeps pace with model size. To shorten training time, distributed training across multiple machines is a natural choice. However, distributed training introduces extra inter-node communication overhead, which hurts training efficiency. This post shows how to pretrain an NLP model (ALBERT) on Amazon SageMaker using the Hugging Face Deep Learning Container (DLC) and the transformers library.
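The workflow the post describes can be sketched with the SageMaker Python SDK's Hugging Face estimator. Everything below is illustrative rather than taken from the post: the script name, S3 path, role ARN, instance choices, and version numbers are all placeholder assumptions.

```python
# Sketch: launching distributed ALBERT pretraining on SageMaker via the
# Hugging Face DLC. All names and versions here are illustrative; a real
# job needs your own training script, S3 data, and IAM role.
from sagemaker.huggingface import HuggingFace

role = "arn:aws:iam::123456789012:role/SageMakerRole"  # placeholder role ARN

estimator = HuggingFace(
    entry_point="train.py",          # hypothetical pretraining script
    role=role,
    instance_type="ml.p3.16xlarge",
    instance_count=2,                # distributed across two machines
    transformers_version="4.6",      # illustrative DLC version combo
    pytorch_version="1.7",
    py_version="py36",
    hyperparameters={"model_name": "albert-base-v2"},
    # SageMaker's data-parallel library helps offset the inter-node
    # communication overhead the post mentions.
    distribution={"smdistributed": {"dataparallel": {"enabled": True}}},
)

estimator.fit({"train": "s3://my-bucket/albert-pretrain/"})  # placeholder S3 path
```

Calling `fit` starts a managed training job inside the Hugging Face DLC image, so no framework installation is needed on the instances themselves.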


Google Cloud launches Deep Learning Containers in beta

#artificialintelligence

Google Cloud Platform (GCP) today launched Deep Learning Containers, environments optimized for deploying and testing applications and services that utilize machine learning. Now in beta, GCP Deep Learning Containers work in the cloud and on premises, making it possible to develop or prototype in either setting. Amazon introduced AWS Deep Learning Containers with Docker image support in March. Google plans for its Deep Learning Containers to "reach parity with all Deep Learning virtual machine types" in the future, according to a blog post sharing the news. The new service includes preconfigured Jupyter and Google Kubernetes Engine (GKE) clusters and launches with machine learning acceleration available from Nvidia GPUs, Intel CPUs, and other hardware.


Train a Deep Graph Network - Amazon SageMaker

#artificialintelligence

In this overview, you learn how to get started with a deep graph network by using one of the DGL containers in Amazon Elastic Container Registry (Amazon ECR). You can also see links to practical examples for deep graph networks. Deep graph networks refer to a type of neural network that is trained to solve graph problems. A deep graph network uses an underlying deep learning framework such as PyTorch or MXNet. The potential for graph networks in practical AI applications is highlighted in the Amazon SageMaker tutorials for Deep Graph Library (DGL).
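The core operation a deep graph network performs is message passing: each node aggregates features from its neighbors before a learned transformation is applied. Below is a minimal, framework-free Python sketch of that aggregation step; it is not DGL itself (real DGL layers add learned weights and nonlinearities), and all names are illustrative.

```python
# Framework-free sketch of one message-passing step in a graph network:
# each node's new feature vector is the mean of its neighbours' features.

def message_passing(adj, feats):
    """adj: {node: [neighbour nodes]}, feats: {node: [float features]}."""
    new_feats = {}
    for node, nbrs in adj.items():
        if not nbrs:
            # Isolated node: keep its own features unchanged.
            new_feats[node] = feats[node][:]
            continue
        dim = len(feats[node])
        agg = [0.0] * dim
        for n in nbrs:               # sum neighbour features element-wise
            for i in range(dim):
                agg[i] += feats[n][i]
        new_feats[node] = [x / len(nbrs) for x in agg]  # mean aggregation
    return new_feats

# Triangle graph: node 0's new feature is the mean of nodes 1 and 2.
adj = {0: [1, 2], 1: [0, 2], 2: [0, 1]}
feats = {0: [1.0], 1: [3.0], 2: [5.0]}
print(message_passing(adj, feats)[0])  # → [4.0]
```

Stacking several such layers (with weights and activations in between) is what makes the network "deep" over the graph structure.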


TensorFlow Enterprise Announced; What Does It Mean For Google Cloud

#artificialintelligence

Enterprises of the previous decade have transformed from transactional to digital. Today, digital enterprises use machine learning pipelines with humans in the loop. The enterprises of tomorrow, however, will aim for end-to-end, AI-driven core business solutions: intelligent enterprises. To address these demands, Google this week announced TensorFlow Enterprise at the ongoing TensorFlow World conference. TensorFlow, one of the most popular machine learning frameworks, was open sourced by Google in 2015.


Review the latest investments in AWS' machine learning platform

#artificialintelligence

AWS is always adding services and features, but it's been particularly active with its AI services of late. For example, AWS rolled out 13 machine learning products at re:Invent 2018 alone. AWS' massive investment in product development is good for users, but this pace of change makes it difficult for IT professionals to keep up. The Deep Learning AMIs and Containers target developers who build sophisticated custom models with the AWS machine learning platform and DevOps teams charged with deploying them on cloud infrastructure. In contrast, the managed services, which join similar products for image recognition, speech transcription and interactive chatbots, are designed for data scientists and non-specialists who want to analyze large and complicated data sets using techniques that are more advanced than standard statistical analysis.


Google Releases Deep Learning Containers into Beta

#artificialintelligence

In a recent blog post, Google announced Deep Learning Containers, which let customers get machine learning projects up and running more quickly. Deep Learning Containers are performance-optimized Docker containers that come with a variety of tools necessary for deep learning tasks already installed. Google released the beta to help customers whose development strategy combines local prototyping with multiple cloud tools, ensuring that all the necessary dependencies are packaged correctly and available in every runtime. With Deep Learning Containers, customers can provision environments consistently for testing and deploying their applications across GCP products and services, like Google Kubernetes Engine (GKE), Cloud Run, and Cloud AI Platform Notebooks, making it easy for them to scale in the cloud or shift across on-premises environments. Furthermore, Google will provide hardware-optimized versions of TensorFlow, whether customers are training on NVIDIA GPUs or deploying on Intel CPUs. In the blog post, Mike Cheng, a software engineer at Google, explains that each container image provides a Python 3 environment, has a preconfigured Jupyter Notebook, and supports the most popular ML frameworks such as TensorFlow, TensorFlow 2.0, PyTorch, and scikit-learn.
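The local-prototyping workflow described above can be sketched with Docker against Google's public image repository. The repository path is the published one; the specific image name and port mapping below are illustrative, so list the available images first and pick the tag that matches your framework.

```shell
# Sketch: discover and run a Google Deep Learning Container locally.
# List the published images (framework/version tags vary over time):
gcloud container images list \
  --repository="gcr.io/deeplearning-platform-release"

# Run a TensorFlow CPU image and expose its preconfigured Jupyter server;
# the image name and mounted directory here are illustrative.
docker run -d -p 8080:8080 \
  -v "$PWD":/home/jupyter \
  gcr.io/deeplearning-platform-release/tf2-cpu
# Jupyter should then be reachable at http://localhost:8080
```

Because the same image can be deployed to GKE or AI Platform Notebooks, the environment you prototype in locally matches the one you scale in the cloud.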


New – AWS Deep Learning Containers

#artificialintelligence

We want to make it as easy as possible for you to learn about deep learning and to put it to use in your applications. If you know how to ingest large datasets, train existing models, build new models, and perform inferences, you'll be well equipped for the future!

New Deep Learning Containers

Today I would like to tell you about the new AWS Deep Learning Containers. These Docker images are ready to use for deep learning training or inference using TensorFlow or Apache MXNet, with other frameworks to follow. We built these containers after our customers told us that they are using Amazon EKS and ECS to deploy their TensorFlow workloads to the cloud, and asked us to make that task as simple and straightforward as possible.
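Pulling one of these images follows the standard ECR workflow sketched below. The registry account ID shown is the public AWS Deep Learning Containers registry; the region and the exact image tag are illustrative assumptions, so check AWS's published list of available tags before pulling.

```shell
# Sketch: authenticate to the AWS DLC registry and pull a TensorFlow
# training image. Region and tag below are illustrative.
aws ecr get-login-password --region us-east-1 | \
  docker login --username AWS \
    --password-stdin 763104351884.dkr.ecr.us-east-1.amazonaws.com

# Pull a CPU training image (tag is an example; consult the tag list):
docker pull 763104351884.dkr.ecr.us-east-1.amazonaws.com/tensorflow-training:2.4.1-cpu-py37-ubuntu18.04
```

The same image reference can then be used in an EKS pod spec or an ECS task definition, which is the deployment path the post describes.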


Amazon's AWS Deep Learning Containers simplify AI app development

#artificialintelligence

Amazon wants to make it easier to get AI-powered apps up and running on Amazon Web Services. Toward that end, it today launched AWS Deep Learning Containers, a library of Docker images preinstalled with popular deep learning frameworks. "We've done all the hard work of building, compiling, and generating, configuring, optimizing all of these frameworks, so you don't have to," Dr. Matt Wood, general manager of deep learning and AI at AWS, said onstage at the AWS Summit in Santa Clara this morning. "And that means that you do less of the undifferentiated heavy lifting of installing these very, very complicated frameworks and then maintaining them." The new AWS container images in question -- which are preconfigured and validated by Amazon -- support Google's TensorFlow machine learning framework and Apache MXNet, with Facebook's PyTorch and other deep learning frameworks to come.


NVIDIAVoice: Lights! Camera! AI! Deep Learning Is Getting Ready for its Close-Up

Forbes - Tech

In order to lose yourself in a great story--whether it's a gaming experience or the next blockbuster superhero movie--you must believe in the world you're entering. Achieving the level of realism necessary to engage audiences often requires sophisticated visual effects and complex animation that involves a great deal of manual input. Deep learning is a subset of AI that is providing game developers, animators, movie makers, and other content creators with inspired shortcuts to complete repetitive tasks much faster, allowing artists to spend more time focusing on valuable creative work. Deep learning works by using layers of mathematics-based computer systems called neural networks to learn a wide variety of complex tasks very rapidly. These networks learn by example, taking in massive amounts of data to recognize patterns and understand how things look and move.